EyesWeb: Toward gesture and affect recognition in interactive dance and music systems
Abstract
The EyesWeb project concerns the development of a system for real-time analysis of the full-body movement and gesture of one or more humans, with a particular focus on affective or emotional content. Such information is used to control and generate sound, music, and visual media, and to control actuators (e.g., robots). A main goal is to explore extensions of music language toward gesture and visual languages. The paper describes the state of the art of the project and the design of the current EyesWeb system modules (hardware and software), reports on their experimental use in public events, and discusses ongoing developments.
EyesWeb is an open software research platform for the design and development of real-time multimodal systems and interfaces.
Description
EyesWeb is an open platform supporting the design and development of real-time multimodal systems and interfaces. It handles a wide range of input devices, including motion-capture systems, various professional and low-cost video cameras, game interfaces (e.g., Kinect, Wii), multichannel audio input (e.g., microphones), and analog inputs (e.g., for physiological signals). Supported outputs include multichannel audio, video, analog devices, and robotic platforms. Various standards are supported, including OSC, MIDI, FreeFrame and VST plugins, ASIO, motion-capture standards and systems (Qualisys), and Matlab.

EyesWeb supports real-time synchronized recording of multimodal channels and includes a number of software libraries, among them the Non-Verbal Expressive Gesture Analysis and Non-Verbal Social Signals Analysis libraries. Users can develop their own software libraries using the EyesWeb development environment. The EyesWeb software comprises a development environment, a distributed run-time system (supporting Windows, Linux, and mobile platforms) for creating distributed or networked real-time applications, and an open set of libraries of reusable software components. The development environment supports the design process of multimodal interactive systems, enabling users to build applications by means of a visual programming language that presents some analogies with computer music languages inspired by analog synthesizers, or with software systems such as Simulink.

EyesWeb is conceived, designed, and developed by InfoMus Lab. The EyesWeb project started in 1997 as a natural evolution of the HARP project. The current release of the open software platform is EyesWeb XMI (eXtended Multimodal Interaction). The platform has been adopted in EU projects of the 5th, 6th, and 7th Framework Programmes (ICT), and by thousands of users worldwide for scientific research, education, and industrial applications.
For example, EyesWeb was selected by Intel in 2008 for its "independent living" hardware, and was adopted at the New York University summer program "Music, dance and new technologies" (2004-2006).
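Among the standards listed above, OSC (OpenSoundControl) is a common way for external sensors or applications to stream data, such as a gesture feature, into a real-time system like EyesWeb over the network. As a minimal sketch of the OSC wire format (per the OSC 1.0 specification), the following Python encodes a single-float message using only the standard library; the address `/eyesweb/energy` is a hypothetical example, not an address defined by EyesWeb itself.

```python
import struct

def osc_pad(data: bytes) -> bytes:
    """Null-terminate and pad a byte string to a multiple of 4 bytes,
    as required for OSC strings (1 to 4 padding NULs)."""
    return data + b"\x00" * (4 - len(data) % 4)

def osc_message(address: str, value: float) -> bytes:
    """Encode an OSC message carrying one 32-bit big-endian float:
    padded address pattern, padded type-tag string ",f", then the float."""
    return (osc_pad(address.encode("ascii"))
            + osc_pad(b",f")
            + struct.pack(">f", value))

# Hypothetical address for a movement-energy feature; the resulting
# datagram would typically be sent over UDP to the receiving patch.
packet = osc_message("/eyesweb/energy", 0.5)
```

Each field is padded to a 4-byte boundary, so the whole packet length is always a multiple of four, which is what an OSC-aware receiver expects when parsing the datagram.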
Computer Music Journal, MIT Press, vol. 24, no. 1, pp. 57-69, 2000